Web Survey Bibliography
In Web surveys, rating scales measuring respondents’ attitudes and self-descriptions by means of a series of related statements are commonly presented in grid (or matrix) questions. Despite the benefits of displaying multiple rating scale items neatly arranged and supposedly easy to complete on a single screen, respondents are often tempted to rely on cognitive shortcuts to reduce the cognitive and navigational effort required to answer a set of rating scale items. To minimize the risk of such shortcuts resulting in satisficing rather than optimal answers, respondents have to be motivated to spend extra time and effort on the attentive and careful processing of rating scales. Interactive Web surveys offer a wide range of visual and dynamic features that allow for visual enhancement and greater interactivity in the presentation of survey questions. To date, however, only a few studies have systematically examined new rating scale designs using data input methods other than conventional radio buttons. In the present study, two different rating scales were designed using drag-and-drop as a more interactive data input method: respondents drag the response options towards the rating scale items (‘drag-response’) or, in the reverse direction, the rating scale items towards the response options (‘drag-item’). In both drag-and-drop rating scales, the visual highlighting of the items and response options, as well as the dynamic strengthening of the link between these key components, is aimed at encouraging respondents to process a rating scale more attentively and carefully.
The effectiveness of the drag-and-drop rating scales in reducing respondents’ susceptibility to cognitive shortcuts is assessed on the basis of five systematic response tendencies typically associated with rating scales: careless, nondifferentiated, acquiescent, and extreme responding, as well as respondents’ systematic tendency to select one of the first response options, so-called primacy effects. Moreover, item missing data, response times, and respondent evaluations are examined. The findings of the present study reveal that although both drag-and-drop scales entail a higher level of respondent burden, as indicated by more item missing data and longer response times compared to conventional radio button scales, they promote respondents’ attentiveness and carefulness towards the response task, which in turn reduces their susceptibility to cognitive shortcuts in processing rating scales.
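The response tendencies named above are commonly operationalized as simple per-respondent indices computed over a grid of rating scale items. The following sketch is illustrative only, not the paper’s exact measures; the function name, the 1-to-`scale_max` coding, and the specific operationalizations (zero within-respondent variance for nondifferentiation, share of agreement-side answers for acquiescence, share of endpoint answers for extremity) are assumptions.

```python
def response_tendency_indicators(ratings, scale_max=5):
    """Per-respondent indices for three response tendencies.

    ratings: one answer per grid item, each coded 1..scale_max.
    """
    n = len(ratings)
    mean = sum(ratings) / n
    # Nondifferentiation: zero variance across items indicates straightlining.
    variance = sum((r - mean) ** 2 for r in ratings) / n
    # Acquiescence: share of answers on the agreement side of the midpoint.
    midpoint = (scale_max + 1) / 2
    acquiescence = sum(r > midpoint for r in ratings) / n
    # Extreme responding: share of answers at either scale endpoint.
    extremity = sum(r in (1, scale_max) for r in ratings) / n
    return {
        "nondifferentiation": variance == 0.0,
        "acquiescence": acquiescence,
        "extremity": extremity,
    }
```

For example, a respondent answering 3 on every item would be flagged for nondifferentiation, while one answering mostly 5s would score high on both acquiescence and extremity.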
Rating scales capturing respondents’ attitudes and personality traits are preferably presented in Web surveys in the form of a grid question. Grid questions do offer certain advantages with respect to a clear layout and the supposedly easy completion of multiple items. At the same time, however, they are more susceptible to systematic response tendencies that can reduce data quality. To counteract the risk of such shortcut strategies, respondents must be motivated to process rating scales attentively and carefully. Web surveys allow the use of visual and interactive elements to enhance the appearance of individual questions and to increase the interactivity of the survey process as a whole. So far, however, only a few studies have examined the use of such design elements in rating scales. Against this background, two different drag-and-drop rating scales are designed in the present study: in the drag-response scale, respondents are asked to drag a selected response option to the respective item with the mouse pointer, whereas in the drag-item scale the respective item is dragged to the selected response option. The drag-and-drop technique is intended to direct attention specifically to the items and response options and to strengthen the link between each item and the selected response option. To test the effectiveness of the two drag-and-drop rating scales with regard to more attentive and careful processing and, ultimately, the prevention of systematic response tendencies, several indicators of data quality are used, including careless responding, nondifferentiation, acquiescence, extremity, and primacy effects.
In addition, the extent of missing data, response times, and respondent evaluations are analyzed. The results of the present study show that the drag-and-drop rating scales do entail increased cognitive and navigational effort, which leads to more missing data and longer response times. At the same time, however, respondents are motivated towards more attentive and careful response behavior, which in turn counteracts systematic response tendencies.
Web survey bibliography (366)
- Grundzüge des Datenschutzrechts und aktuelle Datenschutzprobleme in der Markt- und Sozialforschung; 2017; Schweizer, A.
- Web- and Phone-based Data Collection using Planned Missing Designs; 2017; Revelle, W.; Condon, M. D.; Wilt, J.; French, A. J.; Brown, A.; Elleman, G. L.
- Finding and Investigating Geographical Data Online; 2017; Martin, D.; Cockings, S.; Leung, S.
- CAQDAS at a Crossroads: Affordances of Technology in an Online Environment; 2017; Silver, C.; Bulloch, L. S.
- Artificial Intelligence/Expert Systems and Online Research; 2017; Brent, E.
- Improving the Effectiveness of Online Data Collection by Mixing Survey Modes; 2017; Dillman, D. A.; Hao, F.; Millar, M. M.
- Online Survey Software; 2017; Kaczmirek, L.
- Online Survey Design; 2017; To, N.
- Sampling Methods for Online Surveys; 2017; Fricker, R. D.
- Research Design and Tools for Online Research; 2017; Hewson, C. M.
- Overview: Online Surveys; 2017; Vehovar, V.; Lozar Manfreda, K.
- Using Visual Analogue Scales in eHealth: Non-Response Effects in a Lifestyle Intervention; 2016; Kuhlmann, T.; Reips, U.-D.; Wienert, J.; Lippke, S.
- A Feasibility Study of Recruiting and Maintaining a Web Panel of People with Disabilities; 2016; Chandler, J.
- Inferences from Internet Panel Studies and Comparisons with Probability Samples; 2016; Lachan, R.; Boyle, J.; Harding, R.
- Exploring the Gig Economy Using a Web-Based Survey: Measuring the Online 'and' Offline Side...; 2016; Robles, B. J.; McGee, M.
- Facebook, Twitter, & QR codes: An exploratory trial examining the feasibility of social media mechanisms...; 2016; Gu, L. L.; Skierkowski, D.; Florin, P.; Friend, K.; Ye, Y.
- Distractions: The Incidence and Consequences of Interruptions for Survey Respondents; 2016; Ansolabehere, S.; Schaffner, B. F.
- Mixing modes of data collection in Swiss social surveys: Methodological report of the LIVES-FORS mixed...; 2016; Roberts, C.; Joye, D.; Staehli, M. E.
- Representative web-survey!; 2016; Linde, P.
- The Analysis of Respondent’s Behavior toward Edit Messages in a Web Survey; 2016; Park, Y.
- Refining the Web Response Option in the Multiple Mode Collection of the American Community Survey; 2016; Hughes, T.; Tancreto, J.
- The Utility of an Online Convenience Panel for Reaching Rare and Dispersed Populations; 2016; Sell, R.; Goldberg, S.; Conron, K.
- Comparing online and telephone survey results in the context of a skin cancer prevention campaign evaluation...; 2016; Hollier, L.P.; Pettigrew, S.; Slevin, T.; Strickland, M.; Minto, C.
- Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk; 2016; Berinsky, A.; Huber, G. A.; Lenz, G. S.
- Setting Up an Online Panel Representative of the General Population: The German Internet Panel; 2016; Blom, A. G.; Gathmann, C.; Krieger, U.
- Sample Representation and Substantive Outcomes Using Web With and Without Incentives Compared to Telephone...; 2016; Lipps, O.; Pekari, N.
- Effects of Data Collection Mode and Response Entry Device on Survey Response Quality; 2016; Ha, L.; Zhang, Che.; Jiang, W.
- Navigation Buttons in Web-Based Surveys: Respondents’ Preferences Revisited in the Laboratory; 2016; Romano Bergstrom, J. C.; Erdman, C.; Lakhe, S.
- Web-based versus Paper-based Survey Data: An Estimation of Road Users’ Value of Travel Time Savings...; 2016; Kato, H.; Sakashita, A.; Tsuchiya, Tak.
- Reminder Effect and Data Usability on Web Questionnaire Survey for University Students; 2016; Oishi, T.; Mori, M.; Takata, E.
- Reducing Underreports of Behaviors in Retrospective Surveys: The Effects of Three Different Strategies...; 2016; Lugtig, P. J.; Glasner, T.; Boeve, A.
- Dropouts in Longitudinal Surveys; 2016; Lugtig, P. J.; De Leeuw, E. D.
- Participant recruitment and data collection through Facebook: the role of personality factors; 2016; Rife, S. C.; Cate, K. L.; Kosinski, M.; Stillwell, D.
- What drives the participation in a monthly research web panel? The experience of ELIPSS, a French random...; 2016; Legleye, S.; Cornilleau, A.; Razakamanana, N.
- Quantifying Under- and Overreporting in Surveys Through a Dual-Questioning-Technique Design; 2016; de Jong, M.; Fox, J.-P.; Steenkamp, J.-B. E. M.
- Take the money and run? Redemption of a gift card incentive in a clinician survey; 2016; Chen, J. S.; Sprague, B. L.; Klabunde, C. N.; Tosteson, A. N. A.; Bitton, A.; Onega, T.; MacLean, C....
- Electronic and paper based data collection methods in library and information science research: A comparative...; 2016; Tella, A.
- A Technical Guide to Effective and Accessible web Surveys; 2016; Baatard, G.
- The Validity of Surveys: Online and Offline; 2016; Wiersma, W.
- Methods can matter: Where Web surveys produce different results than phone interviews; 2016; Keeter, S.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- Will They Stay or Will They Go? Personality Predictors of Dropout in Online Study; 2016; Nestler, S.; Thielsch, M.; Vasilev, E.; Back, M.
- A Framework of Incorporating Thai Social Networking Data in Online Marketing Survey; 2016; Jiamthapthaksin, R.; Aung, T. H.; Ratanasawadwat, N.
- Development of a scale to measure skepticism toward electronic word-of-mouth; 2016; Zhang, Xia.; Ko, M.; Carpenter, D.
- Improving social media measurement in surveys: Avoiding acquiescence bias in Facebook research; 2016; Kuru, O.; Pasek, J.
- Psychological research in the internet age: The quality of web-based data; 2016; Ramsey, S. R.; Thompson, K. L.; McKenzie, M.; Rosenbaum, A.
- Internet Abusive Use Questionnaire: Psychometric properties; 2016; Calvo-Frances, F.
- Revisiting “yes/no” versus “check all that apply”: Results from a mixed modes...; 2016; Nicolaas, G.; Campanelli, P.; Hope, S.; Jaeckle, A.; Lynn, P.
- A Statistical Approach to Provide Individualized Privacy for Surveys; 2016; Esponda, F.; Huerta, K.; Guerrero, V. M.
- Online and Social Media Data As an Imperfect Continuous Panel Survey; 2016; Diaz, F.; Garmon, F.; Hofman, J. K.; Kiciman, E.; Rothschild, D.